A general class of nonlinear normalized adaptive filtering algorithms

Authors

  • Sudhakar Kalluri
  • Gonzalo R. Arce
Abstract

The normalized least mean square (NLMS) algorithm is an important variant of the classical LMS algorithm for adaptive linear filtering. It possesses many advantages over the LMS algorithm, including faster convergence and an automatic time-varying choice of the LMS step-size parameter, which affects the stability, steady-state mean square error (MSE), and convergence speed of the algorithm. An auxiliary fixed step-size that is often introduced in the NLMS algorithm has the advantage that its stability region (the step-size range for algorithm stability) is independent of the signal statistics. In this paper, we generalize the NLMS algorithm by deriving a class of nonlinear normalized LMS-type (NLMS-type) algorithms that are applicable to a wide variety of nonlinear filter structures. We obtain a general nonlinear NLMS-type algorithm by choosing an optimal time-varying step-size that minimizes the next-step MSE at each iteration of the general nonlinear LMS-type algorithm. As in the linear case, we introduce a dimensionless auxiliary step-size whose stability range is independent of the signal statistics. The stability region can therefore be determined empirically for any given nonlinear filter type. We present computer simulations of these algorithms for two specific nonlinear filter structures: Volterra filters and the recently proposed class of Myriad filters. These simulations indicate that the NLMS-type algorithms, in general, converge faster than their LMS-type counterparts.
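As a rough illustration of the idea in the abstract, the sketch below (Python with NumPy) shows an NLMS-type update in which a fixed, dimensionless auxiliary step-size is normalized at each iteration by the squared norm of the instantaneous gradient of the filter output with respect to the filter parameters; for a linear FIR filter this reduces to the classical NLMS normalization by the squared input norm. The function names, the callables filter_output and filter_gradient, the auxiliary step-size mu_tilde, and the regularizer eps are illustrative choices, not taken from the paper, and the exact optimal step-size derived in the paper may differ.

# Sketch of a nonlinear normalized LMS-type (NLMS-type) update: a fixed,
# dimensionless auxiliary step-size mu_tilde is divided by the squared norm
# of the instantaneous gradient of the filter output w.r.t. the parameters.
# The exact normalization derived in the paper may differ; this is an
# illustrative assumption.

import numpy as np

def nlms_type_update(w, x, d, filter_output, filter_gradient,
                     mu_tilde=0.5, eps=1e-8):
    """One iteration of a nonlinear NLMS-type adaptive algorithm (sketch).

    w               : current parameter vector of the nonlinear filter
    x               : current input (observation) vector
    d               : desired response sample
    filter_output   : callable, y = filter_output(w, x)
    filter_gradient : callable, gradient of the output w.r.t. w at (w, x)
    mu_tilde        : dimensionless auxiliary step-size (assumed)
    eps             : small regularizer to avoid division by zero (assumed)
    """
    y = filter_output(w, x)                    # filter output
    e = d - y                                  # instantaneous error
    g = filter_gradient(w, x)                  # gradient of output w.r.t. w
    mu_k = mu_tilde / (eps + np.dot(g, g))     # time-varying step-size
    return w + mu_k * e * g                    # LMS-type update, normalized

# Example: for a linear FIR filter (output w^T x, gradient x) the update
# coincides with the classical NLMS algorithm.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 8
    w_true = rng.standard_normal(N)            # unknown system to identify
    w = np.zeros(N)
    for _ in range(2000):
        x = rng.standard_normal(N)
        d = w_true @ x + 0.01 * rng.standard_normal()   # noisy desired signal
        w = nlms_type_update(w, x, d,
                             filter_output=lambda w, x: w @ x,
                             filter_gradient=lambda w, x: x)
    print("parameter error:", np.linalg.norm(w - w_true))

Only the two callables change from one nonlinear filter structure to another (for example, a Volterra or myriad filter); the normalization then scales the update by the instantaneous gradient energy of that particular structure.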


Similar Articles

A Family of Selective Partial Update Affine Projection Adaptive Filtering Algorithms

In this paper, we present a general formalism for establishing the family of selective partial update affine projection algorithms (SPU-APA). The SPU-APA, the SPU regularized APA (SPU-R-APA), the SPU partial rank algorithm (SPU-PRA), the SPU binormalized data-reusing least mean squares (SPU-BNDR-LMS), and the SPU normalized LMS with orthogonal correction factors (SPU-NLMS-OCF) algorithms...


Image Restoration with Two-Dimensional Adaptive Filter Algorithms

Two-dimensional (TD) adaptive filtering is a technique that can be applied to many image and signal processing applications. This paper extends one-dimensional adaptive filter algorithms to TD structures, establishing novel TD adaptive filters. Based on this extension, the TD variable step-size normalized least mean squares (TD-VSS-NLMS), the TD-VSS affine projection algorithms (...


A General Class of Nonlinear Normalized Adaptive Filtering Algorithms

The Normalized Least Mean Square (NLMS) algorithm is an important variant of the classical LMS algorithm for adaptive linear filtering. It possesses many advantages over the LMS algorithm, including having a faster convergence and providing for an automatic time-varying choice of the LMS step-size parameter which affects the stability, steady-state mean square error (MSE) and convergence speed of...


A General Class of Nonlinear Normalized LMS-type Adaptive Algorithms

The Normalized Least Mean Square (NLMS) algorithm is an important variant of the classical LMS algorithm for adaptive linear FIR filtering. It provides an automatic choice for the LMS step-size parameter which affects the stability, convergence speed and steady-state performance of the algorithm. In this paper, we generalize the NLMS algorithm by deriving a class of Nonlinear Normalized LMS-type (...


A fully adaptive normalized nonlinear gradient descent algorithm for complex-valued nonlinear adaptive filters

A fully adaptive normalized nonlinear complex-valued gradient descent (FANNCGD) learning algorithm for training nonlinear (neural) adaptive finite impulse response (FIR) filters is derived. First, a normalized nonlinear complex-valued gradient descent (NNCGD) algorithm is introduced. For rigour, the remainder of the Taylor series expansion of the instantaneous output error in the derivation of ...
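For intuition only, the following minimal real-valued sketch shows the kind of normalized nonlinear gradient descent step this summary refers to, for a single-neuron FIR filter y = phi(w^T x). It is not the authors' complex-valued FANNCGD algorithm: the tanh activation, the constant C, and the particular normalization (squared input norm scaled by the squared activation derivative) are assumptions made here for illustration, and the fully adaptive, Taylor-remainder-based normalization mentioned above is not reproduced.

# Real-valued simplification of a normalized nonlinear gradient descent step
# for a single-neuron FIR filter y = phi(w^T x).  The normalization below is
# an illustrative assumption, not the authors' exact formula.

import numpy as np

def nngd_step(w, x, d, mu=1.0, C=1e-2):
    net = w @ x
    y = np.tanh(net)                          # nonlinear activation (assumed tanh)
    e = d - y                                 # instantaneous output error
    dphi = 1.0 - y * y                        # derivative tanh'(net)
    eta = mu / (C + (dphi ** 2) * (x @ x))    # normalized learning rate
    return w + eta * e * dphi * x             # gradient-descent update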



Journal:
  • IEEE Trans. Signal Processing

Volume 47, Issue -

Pages -

Publication date: 1999